Noisy Expectation-Maximization: Applications and Generalizations

Authors

  • Osonde Osoba
  • Bart Kosko
Abstract

We present a noise-injected version of the Expectation-Maximization (EM) algorithm: the Noisy Expectation-Maximization (NEM) algorithm. The NEM algorithm uses noise to speed up the convergence of the EM algorithm. The NEM theorem shows that injected noise speeds up the average convergence of the EM algorithm to a local maximum of the likelihood surface if a positivity condition holds. The generalized form of the NEM algorithm allows for arbitrary modes of noise injection, including adding noise to or multiplying noise with the data. We demonstrate these noise benefits on EM algorithms for the Gaussian mixture model (GMM) with both additive and multiplicative NEM noise injection. A separate theorem (not presented here) shows that the noise benefit for independent and identically distributed additive noise decreases with sample size in mixture models. This theorem implies that the noise benefit is most pronounced if the data is sparse. Injecting blind noise only slowed convergence.

I. NOISE BOOSTING THE EXPECTATION-MAXIMIZATION ALGORITHM

We show how carefully chosen and injected noise can speed convergence of the popular expectation-maximization (EM) algorithm. A general theorem allows arbitrary modes of combining signal and noise to improve the speed of parameter estimation. The result still speeds EM convergence on average at each iteration so long as the injected noise satisfies a positivity condition. The EM algorithm generalizes maximum-likelihood estimation to the case of missing or corrupted data [1], [2]. Maximum likelihood maximizes the conditional signal probability density function (pdf) f(y|θ) for a random signal variable Y given a vector of parameters θ. It equally maximizes the log-likelihood ln f(y|θ) since the logarithm is monotone increasing. So the maximum-likelihood estimate θ∗ is θ∗ = argmax_θ ln f(y|θ).
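The noise-boosted EM loop described above can be sketched for a one-dimensional GMM. This is a minimal illustration, not the authors' implementation: the function name, initialization, and the polynomial cooling schedule are our assumptions; the screening test n(n − 2(μ_j − y)) ≤ 0 for every component j is one published form of the additive GMM-NEM positivity condition.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_gmm_1d(y, k=2, iters=50, noise_scale=0.0, decay=2.0):
    """EM for a 1-D Gaussian mixture with optional NEM-style additive noise.

    Candidate noise is screened by the additive GMM-NEM positivity
    condition n*(n - 2*(mu_j - y)) <= 0 for every component j (rejected
    samples are zeroed), and its standard deviation cools as t**-decay
    so the noisy iteration anneals back to ordinary EM.  noise_scale=0
    recovers plain EM.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    w = np.full(k, 1.0 / k)                        # mixing weights
    mu = np.quantile(y, np.linspace(0.1, 0.9, k))  # spread-out initial means
    var = np.full(k, y.var())                      # initial variances
    for t in range(1, iters + 1):
        yt = y
        if noise_scale > 0:
            cand = rng.normal(0.0, noise_scale * t ** -decay, size=n)
            cond = cand[:, None] * (cand[:, None] - 2.0 * (mu[None, :] - y[:, None]))
            yt = y + np.where(np.all(cond <= 0.0, axis=1), cand, 0.0)
        # E-step: posterior responsibilities under the current parameters
        d = yt[:, None] - mu[None, :]
        p = w * np.exp(-0.5 * d ** 2 / var) / np.sqrt(2.0 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * yt[:, None]).sum(axis=0) / nk
        var = (r * (yt[:, None] - mu[None, :]) ** 2).sum(axis=0) / nk
    return w, mu, var
```

Because the screening test must hold for every component simultaneously, most candidate noise is rejected for well-separated data, which matches the paper's point that the benefit is largest when data is sparse.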

Similar Articles

Estimating Noise from Noisy Speech Features with a Monte Carlo Variant of the Expectation-Maximization Algorithm

In this work, we derive a Monte Carlo expectation maximization algorithm for estimating noise from a noisy utterance. In contrast to earlier approaches, where the distribution of noise was estimated based on a vector Taylor series expansion, we use a combination of importance sampling and Parzen-window density estimation to numerically approximate the occurring integrals with the Monte Carlo me...
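The Monte Carlo machinery this entry mentions rests on importance sampling to approximate intractable integrals. A minimal self-normalized sketch follows; all names and signatures here are illustrative, not taken from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_estimate(g, target_logpdf, draw_proposal, proposal_logpdf, n=20000):
    """Self-normalized importance sampling estimate of E_f[g(X)].

    target_logpdf may be unnormalized: the normalizing constant cancels
    when the weights are normalized.
    """
    x = draw_proposal(n)
    logw = target_logpdf(x) - proposal_logpdf(x)
    w = np.exp(logw - logw.max())   # subtract max for numerical stability
    w /= w.sum()
    return float(np.sum(w * g(x)))

# Example: estimate E[X^2] = 1 for X ~ N(0, 1) via a wider N(0, 2^2) proposal.
est = importance_estimate(
    g=lambda x: x ** 2,
    target_logpdf=lambda x: -0.5 * x ** 2,        # N(0,1), up to a constant
    draw_proposal=lambda n: rng.normal(0.0, 2.0, n),
    proposal_logpdf=lambda x: -x ** 2 / 8.0,      # N(0,4), up to a constant
)
```

A proposal wider than the target keeps the importance weights bounded, which is the usual design choice for this estimator.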

On Regularization Methods of EM-Kaczmarz Type

We consider regularization methods of Kaczmarz type in connection with the expectation-maximization (EM) algorithm for solving ill-posed equations. For noisy data, our methods are stabilized extensions of the well established ordered-subsets expectation-maximization iteration (OS-EM). We show monotonicity properties of the methods and present a numerical experiment which indicates that the exte...

Multi-Channel Image Identification and Restoration Using the Expectation-Maximization Algorithm

Previous work has demonstrated the effectiveness of the Expectation-Maximization algorithm to restore noisy and blurred single-channel images and simultaneously identify their blur. In addition, a general framework for processing multi-channel images using single-channel techniques has also been developed. This paper combines and extends the two approaches to the simultaneous blur identification an...

Inverse Reinforcement Learning Under Noisy Observations (Extended Abstract)

We consider the problem of performing inverse reinforcement learning when the trajectory of the expert is not perfectly observed by the learner. Instead, noisy observations of the trajectory are available. We generalize the previous method of expectation-maximization for inverse reinforcement learning, which allows the trajectory of the expert to be partially hidden from the learner, to incorpo...

Learning a Spelling Error Model from Search Query Logs

Applying the noisy channel model to search query spelling correction requires an error model and a language model. Typically, the error model relies on a weighted string edit distance measure. The weights can be learned from pairs of misspelled words and their corrections. This paper investigates using the Expectation Maximization algorithm to learn edit distance weights directly from search qu...

Journal:
  • CoRR

Volume abs/1801.04053  Issue 

Pages  -

Publication date 2018